The Last of Us: TV finally has the perfect video game adaptation

The Guardian

The Last of Us came out in 2013 on the PlayStation 3 and is considered one of the best video games ever made. I know this because the week it came out I drew the curtains on the front room of my shared house, forbade all of my housemates from entering the zone unless they were going to watch in reverent silence, and completed it. In the game you play Joel – finally, some Joel representation! Every human you encounter is trying to stab you or scavenge bullets off you or recruit you to one side of a conflict between the citizen army and the underground uprising. Every monster you meet is infected with a brain fungus that makes them blind, bulbous and very bitey. But what made the game stand out was the story: Joel is escorting Ellie, a fungus-proof teenage girl and humanity's last hope, across a long trail that will take them both to safety.


Global Big Data Conference

#artificialintelligence

Tool use has long been a hallmark of human intelligence, as well as a practical problem to solve for a vast array of robotic applications. But machines are still wonky at exerting just the right amount of force to control tools that aren't rigidly attached to their hands. To manipulate said tools more robustly, researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), in collaboration with the Toyota Research Institute (TRI), have designed a system that can grasp tools and apply the appropriate amount of force for a given task, like squeegeeing up liquid or writing out a word with a pen. The system, dubbed Series Elastic End Effectors, or SEED, uses soft bubble grippers and embedded cameras to map how the grippers deform over a six-dimensional space (think of an airbag inflating and deflating) and apply force to a tool. Using six degrees of freedom, the object can be moved left and right, up or down, back and forth, roll, pitch, and yaw. The closed-loop controller--a self-regulating system that maintains a desired state without human interaction--uses SEED and visuotactile feedback to adjust the position of the robot arm in order to apply the desired force.
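The closed-loop idea can be sketched as a simple proportional controller on the force error: measure the force, compare it with the target, and nudge the arm to shrink the gap. This is only a schematic illustration of the control loop described above, not MIT's actual SEED code; the spring-like sensor model, gain, and function names are all invented for the sketch.

```python
# Schematic sketch of a closed-loop force controller: the measured force
# (here from a stand-in "compliant gripper" sensor model) is compared with
# the desired force, and the arm position is nudged to shrink the error.
# All names and constants are hypothetical, for illustration only.

def measured_force(position, stiffness=50.0):
    """Toy sensor model: a compliant gripper acts roughly like a spring,
    so force grows linearly with how far the arm presses in."""
    return stiffness * max(position, 0.0)

def force_control_step(position, desired_force, gain=0.001):
    """One iteration of a proportional controller on the force error."""
    error = desired_force - measured_force(position)
    return position + gain * error  # press in (+) or back off (-) as needed

def settle(desired_force, position=0.0, steps=200):
    for _ in range(steps):
        position = force_control_step(position, desired_force)
    return position

pos = settle(desired_force=5.0)
print(round(measured_force(pos), 2))  # converges near the 5.0 target
```

The visuotactile feedback in the real system plays the role of `measured_force` here: the cameras inside the bubble grippers estimate the applied force from how the membrane deforms, and the controller adjusts the arm until that estimate matches the target.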


How to Use Conversational AI to Accelerate Revenue Growth for Your Company

#artificialintelligence

It can be difficult to ensure that chats are answered within the right amount of time after a customer asks a question. Staffing live chat 24/7 is expensive, but staffing it only during the day can mean losing interactions that drive revenue. Chat also should not be placed on every page of your site: add it only to high-intent pages related to sales, or your sales team could get inundated with unrelated questions all day.


Microsoft Azure Taking A Bold Step in Transforming Agriculture

#artificialintelligence

Farmers can analyze thousands of data points collected from their farms about climate, temperature, and soil conditions. This can help them decide which types of seeds to use given the soil conditions at the time. With AI sensors, farmers can precisely target weeds and apply just the right amount of herbicide to treat the most diseased crops, improving the overall quality of the harvest. And with substantial amounts of data now available, farmers can build seasonal models that predict agricultural yield and productivity with high accuracy.
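The "right amount of herbicide" step can be pictured as a variable-rate dosing rule: spray only plants whose sensor-derived disease score crosses a threshold, and scale the dose with severity. This is a hypothetical sketch, not Azure's pipeline or agronomic advice; the threshold, maximum dose, and score scale are made up.

```python
# Hypothetical variable-rate herbicide rule: skip healthy plants and
# scale the dose with disease severity. Numbers are illustrative only.

def herbicide_dose_ml(disease_score, threshold=0.3, max_dose_ml=12.0):
    """disease_score in [0, 1] from an AI sensor; returns dose per plant."""
    if disease_score < threshold:
        return 0.0  # healthy enough: no spray, no wasted chemical
    # scale linearly from 0 at the threshold to max_dose at score 1.0
    severity = (disease_score - threshold) / (1.0 - threshold)
    return round(max_dose_ml * severity, 2)

field = [0.1, 0.45, 0.9, 0.3]
print([herbicide_dose_ml(s) for s in field])  # → [0.0, 2.57, 10.29, 0.0]
```

The payoff of a rule like this is exactly what the blurb describes: chemicals go only where the sensors say they are needed, rather than being applied uniformly across the field.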


This Smart Robot Can See, Feel, and Think

#artificialintelligence

The robot brain has evolved and finally come to its senses. Researchers have created a smart robot brain that is closer to functioning like a human one by using neuromorphic computing -- mimicking the human mind with life-like brain functions. What's more, they've integrated the bot with artificial skin and sensors, giving it the ability to "see" and "feel." Humans can easily do things like pull their keys out of a pocket or lift a glass to take a sip. These actions require an intuitive mind -- the smarts to look at an object, feel its shape, and grab and lift it with just the right amount of force.


Amazon Elastic Inference adds support for PyTorch machine learning models - SiliconANGLE

#artificialintelligence

Amazon Web Services Inc. announced today that it's adding support for PyTorch models with its Amazon Elastic Inference service, which it said will help developers reduce the costs of deep learning inference by as much as 75% in some cases. Amazon Elastic Inference is a service launched in late 2018 that enables customers to attach graphics processing unit-powered inference acceleration to a standard Amazon EC2 instance. Inference refers to the process of making predictions using a trained deep learning model. PyTorch is an open-source machine learning library that was first developed by Facebook Inc. It's used primarily for applications such as computer vision and natural language processing.
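"Inference" in the passage above just means running new input through already-trained weights to get a prediction, with no learning involved. A minimal pure-Python sketch of that idea (not the Elastic Inference API itself, which attaches GPU-backed accelerators to EC2 instances and typically runs TorchScript-compiled PyTorch models; the weights here are made up):

```python
# Minimal illustration of "inference": applying fixed, already-trained
# weights to new input to produce a prediction. Conceptual sketch only;
# the weights and bias below are invented for the example.

import math

WEIGHTS = [0.8, -0.4, 0.2]  # learned earlier, during training
BIAS = 0.1

def predict(features):
    """One forward pass of a tiny logistic model; no learning happens here."""
    z = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    return 1.0 / (1.0 + math.exp(-z))  # probability in (0, 1)

print(round(predict([1.0, 2.0, 3.0]), 3))  # → 0.668
```

Because inference is just repeated forward passes like this, often at high volume, it is frequently the dominant cost of a deployed model -- which is the cost Elastic Inference aims to cut by right-sizing the attached acceleration.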


Self-driving truck boss: 'Supervised machine learning doesn't live up to the hype. It isn't C-3PO, it's sophisticated pattern matching'

#artificialintelligence

Roundup Let's get cracking with some machine-learning news. Starsky Robotics is no more: the self-driving truck startup has shut down after running out of money and failing to raise more funds. CEO Stefan Seltz-Axmacher bid a touching farewell to his upstart, founded in 2016, in a Medium post this month. He was upfront and honest about why Starsky failed: "Supervised machine learning doesn't live up to the hype," he declared. Neural networks only learn to pick up on certain patterns after they are faced with millions of training examples.


Using AI to forecast resource supplies in natural disasters - SciTech Europa

#artificialintelligence

Exigent, a leading technology provider and data-driven consulting organisation, and the Schulich School of Business at York University have announced a partnership to create a predictive analytics model that identifies and forecasts the supply of, and demand for, necessary resources in a disaster-related emergency. The model evaluates existing wildfire data points and feeds into an ad hoc trading platform that key stakeholders can use to option the right amount of services and supplies in the most cost-effective manner. The project aims to bring together local governments, insurers and medical supplies providers to collaborate and plan proactively for optimal disaster management. Available in June 2020, the platform is the first in a series of analytics tools that the Schulich School of Business and Exigent will develop to deliver on their core focus: turning data into actionable business intelligence and community-centric analytics products. The collaboration is part of the Masters in Business Analytics Program (MBAN) at Schulich.
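The "most cost-effective" procurement step on such a trading platform can be sketched as a greedy plan: given a forecast demand for a resource and a set of priced supplier offers, fill the demand from the cheapest offers first. This is a hypothetical illustration of the idea, not the partnership's actual model; the supplier names, quantities, and prices are invented.

```python
# Hypothetical greedy procurement sketch: satisfy a forecast demand from
# the cheapest supplier offers first. All offers below are invented.

def plan_procurement(demand, offers):
    """offers: list of (supplier, units_available, price_per_unit).
    Returns (plan, shortfall); shortfall > 0 means forecast unmet demand."""
    plan, remaining = [], demand
    for supplier, units, price in sorted(offers, key=lambda o: o[2]):
        if remaining <= 0:
            break
        take = min(units, remaining)  # buy as much as needed, no more
        plan.append((supplier, take, take * price))
        remaining -= take
    return plan, remaining

offers = [("MedCo", 400, 2.5), ("AidCorp", 300, 1.0), ("RapidSupply", 500, 1.8)]
plan, shortfall = plan_procurement(1000, offers)
print(plan, shortfall)
```

A real platform would add the forecasting side (the wildfire model predicting `demand`) and constraints such as delivery time and location, but the core "option the right amount at the lowest cost" decision reduces to an allocation like this.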